Continuous Multiclass Labeling Approaches and Algorithms
We study convex relaxations of the image labeling problem on a continuous
domain with regularizers based on metric interaction potentials. The generic
framework ensures existence of minimizers and covers a wide range of
relaxations of the originally combinatorial problem. We focus on two specific
relaxations that differ in flexibility and simplicity -- one can be used to
tightly relax any metric interaction potential, while the other one only covers
Euclidean metrics but requires less computational effort. For solving the
nonsmooth discretized problem, we propose a globally convergent
Douglas-Rachford scheme, and show that a sequence of dual iterates can be
recovered in order to provide a posteriori optimality bounds. In a quantitative
comparison to two other first-order methods, the approach shows competitive
performance on synthetic and real-world images. By combining the method with
an improved binarization technique for nonstandard potentials, we were able to
routinely recover discrete solutions within 1%--5% of the global optimum for
the combinatorial image labeling problem.
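The Douglas-Rachford scheme mentioned above alternates proximal steps on the two parts of a nonsmooth objective. As a minimal illustration (not the paper's labeling relaxation), the sketch below applies Douglas-Rachford splitting to a small l1-regularized least-squares problem, where both proximal maps are cheap; the function names and test data are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford(A, b, lam, step=1.0, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by Douglas-Rachford splitting."""
    n = A.shape[1]
    M = np.eye(n) + step * A.T @ A   # prox of the smooth term solves a linear system
    Atb = step * A.T @ b
    z = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(M, z + Atb)             # prox of 0.5||A.-b||^2 at z
        y = soft_threshold(2.0 * x - z, step * lam)  # prox of lam||.||_1 at the reflection
        z = z + y - x                                # update of the governing sequence
    return y

# synthetic sparse recovery problem
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x = douglas_rachford(A, b, lam=0.1)
```

The same pattern carries over to the labeling setting, where the second prox is replaced by the projection onto the relaxed label simplex and the dual iterates yield the a posteriori bounds described above.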
Duality-based Higher-order Non-smooth Optimization on Manifolds
We propose a method for solving non-smooth optimization problems on
manifolds. In order to obtain superlinear convergence, we apply a Riemannian
Semi-smooth Newton method to a non-smooth non-linear primal-dual optimality
system based on a recent extension of Fenchel duality theory to Riemannian
manifolds. We also propose an inexact version of the Riemannian Semi-smooth
Newton method and prove conditions for local linear and superlinear
convergence. Numerical experiments on l2-TV-like problems confirm superlinear
convergence on manifolds with positive and negative curvature.
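The core idea of a semi-smooth Newton method can be shown in the flat (Euclidean) special case, without the Riemannian machinery: a nonsmooth primal optimality system is solved with Newton steps built from an element of the generalized (Clarke) Jacobian, plus simple damping for robustness. The sketch below, with hypothetical names and test data, applies this to the fixed-point equation of l1-regularized least squares.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def semismooth_newton_lasso(A, b, lam, t=0.1, iters=50, tol=1e-10):
    """Solve F(x) = x - prox_{t*lam*||.||_1}(x - t*(A^T A x - A^T b)) = 0,
    whose roots are the minimizers of 0.5||Ax-b||^2 + lam||x||_1."""
    H, Atb = A.T @ A, A.T @ b
    n = A.shape[1]

    def F(x):
        return x - soft_threshold(x - t * (H @ x - Atb), t * lam)

    x = np.zeros(n)
    for _ in range(iters):
        Fx = F(x)
        nFx = np.linalg.norm(Fx)
        if nFx < tol:
            break
        u = x - t * (H @ x - Atb)
        d = (np.abs(u) > t * lam).astype(float)           # generalized derivative of the prox
        J = np.eye(n) - d[:, None] * (np.eye(n) - t * H)  # element of Clarke's Jacobian of F
        step = np.linalg.solve(J, -Fx)
        alpha = 1.0                                       # damping: backtrack on the residual
        while np.linalg.norm(F(x + alpha * step)) > (1 - 1e-4 * alpha) * nFx and alpha > 1e-8:
            alpha *= 0.5
        x = x + alpha * step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x = semismooth_newton_lasso(A, b, lam=0.1)
```

On a manifold, the Euclidean linear solve is replaced by a Newton equation in the tangent space and the update by a retraction, but the structure of the iteration is the same.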
Sublabel-Accurate Relaxation of Nonconvex Energies
We propose a novel spatially continuous framework for convex relaxations
based on functional lifting. Our method can be interpreted as a
sublabel-accurate solution to multilabel problems. We show that previously
proposed functional lifting methods optimize an energy which is linear between
two labels and hence require (often infinitely) many labels for a faithful
approximation. In contrast, the proposed formulation is based on a piecewise
convex approximation and therefore needs far fewer labels. In comparison to
recent MRF-based approaches, our method is formulated in a spatially continuous
setting and shows less grid bias. Moreover, in a local sense, our formulation
is the tightest possible convex relaxation. It is easy to implement and allows
an efficient primal-dual optimization on GPUs. We show the effectiveness of our
approach on several computer vision problems.
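The primal-dual optimization referred to above follows the standard first-order primal-dual (Chambolle-Pock) template. As a self-contained illustration on the simplest related model, the sketch below runs that iteration on scalar TV denoising (the ROF model), not on the lifted multilabel energy itself; the function names and test image are assumptions for the example.

```python
import numpy as np

def grad(u):
    """Forward differences with Neumann boundary."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    """Negative adjoint of grad, so that <grad u, p> = -<u, div p>."""
    d = np.zeros_like(px)
    d[:, 0] += px[:, 0]; d[:, 1:] += px[:, 1:] - px[:, :-1]
    d[0, :] += py[0, :]; d[1:, :] += py[1:, :] - py[:-1, :]
    return d

def tv_denoise(f, lam=0.2, iters=300, tau=0.25, sigma=0.25):
    """Primal-dual iteration for min_u 0.5*||u - f||^2 + lam*TV(u).
    Step sizes satisfy tau*sigma*L^2 <= 1 with L^2 = 8 for this grad."""
    u = f.copy(); ubar = f.copy()
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(iters):
        gx, gy = grad(ubar)
        px, py = px + sigma * gx, py + sigma * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2) / lam)  # project dual onto |p| <= lam
        px, py = px / norm, py / norm
        u_new = (u + tau * div(px, py) + tau * f) / (1.0 + tau)  # prox of the data term
        ubar = 2.0 * u_new - u                                   # over-relaxation step
        u = u_new
    return u

rng = np.random.default_rng(0)
clean = np.zeros((32, 32)); clean[:, 16:] = 1.0   # piecewise-constant test image
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
den = tv_denoise(noisy, lam=0.2)
```

Every update is an independent per-pixel operation plus local differences, which is why this class of methods maps so well onto GPUs.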